Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
Authors
Abstract
We introduce two algorithms for nonconvex regularized finite sum minimization, where typical Lipschitz differentiability assumptions are relaxed to the notion of relative smoothness. The first one is a Bregman extension of Finito/MISO, studied for fully nonconvex problems when the sampling is random, or under convexity of the nonsmooth term when the sampling is essentially cyclic. The second algorithm is a low-memory variant, in the spirit of SVRG and SARAH, that also allows fully nonconvex formulations. Our analysis is made remarkably simple by employing a Bregman Moreau envelope as a Lyapunov function. In the randomized case, linear convergence is established when the cost function is strongly convex, yet with no convexity requirements on the individual functions in the sum. For the cyclic variants, global convergence results are established when the cost function satisfies the Kurdyka–Łojasiewicz property.
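For reference, the notions named in the abstract have the following standard forms; the paper's exact conventions may differ in minor details. Given a differentiable convex kernel h, the Bregman distance is

    D_h(x, y) = h(x) - h(y) - \langle \nabla h(y), x - y \rangle,

a differentiable function f_i is L_i-smooth relative to h when L_i h - f_i is convex, which is equivalent to

    f_i(x) \le f_i(y) + \langle \nabla f_i(y), x - y \rangle + L_i D_h(x, y) \quad \text{for all } x, y,

and one common definition of the Bregman Moreau envelope of a function \varphi with stepsize \gamma > 0 is

    \varphi^h_\gamma(x) = \min_z \big\{ \varphi(z) + \tfrac{1}{\gamma} D_h(z, x) \big\},

which is the type of quantity used as a Lyapunov function in the analysis.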
Similar resources
A coordinate gradient descent method for ℓ1-regularized convex minimization
In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., the problem of minimizing ℓ1-regularized linear least squares problems. In this paper, we propose a block coordinate gradient descent method (abbreviated a...
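The ℓ1-regularized linear least squares problem referred to here has the standard form (data matrix A, observation vector b, and regularization weight \lambda > 0 are assumed for illustration):

    \min_x \ \tfrac{1}{2} \| A x - b \|_2^2 + \lambda \| x \|_1.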
Proximal Stochastic Methods for Nonsmooth Nonconvex Finite-Sum Optimization
We analyze stochastic algorithms for optimizing nonconvex, nonsmooth finite-sum problems, where the nonsmooth part is convex. Surprisingly, unlike the smooth case, our knowledge of this fundamental problem is very limited. For example, it is not known whether the proximal stochastic gradient method with constant minibatch converges to a stationary point. To tackle this issue, we develop fast st...
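As a minimal illustration of the setting, the sketch below implements a generic proximal stochastic gradient step for a finite sum with an ℓ1 nonsmooth term, whose proximal map is soft-thresholding; the names and parameters are assumptions made for illustration, and this is not the specific fast variant developed by the authors.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal map of t * ||.||_1 (elementwise soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def prox_sgd(grad_fi, n, x0, step, reg, iters, batch=1, seed=0):
        # Generic proximal stochastic gradient method for
        #   min_x (1/n) * sum_i f_i(x) + reg * ||x||_1,
        # where grad_fi(i, x) returns the gradient of f_i at x.
        # Illustrative sketch only; step-size and minibatch choices matter
        # for convergence guarantees in the nonconvex setting.
        rng = np.random.default_rng(seed)
        x = x0.copy()
        for _ in range(iters):
            idx = rng.integers(0, n, size=batch)            # sample a minibatch of indices
            g = np.mean([grad_fi(i, x) for i in idx], axis=0)
            x = soft_threshold(x - step * g, step * reg)    # gradient step followed by prox
        return x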
Linearized Bregman for l1-regularized Logistic Regression
Sparse logistic regression is an important linear classifier in statistical learning, providing an attractive route for feature selection. A popular approach is based on minimizing an l1-regularization term with a regularization parameter λ that affects the solution sparsity. To determine an appropriate value for the regularization parameter, one can apply the grid search method or the Bayesian...
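The underlying objective is typically written, with feature vectors x_i, labels y_i \in \{-1, +1\}, weight vector w, and regularization parameter \lambda (conventions assumed here for illustration), as

    \min_w \ \sum_{i=1}^{m} \log\big(1 + \exp(-y_i x_i^\top w)\big) + \lambda \| w \|_1.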
Iterative Bregman Projections for Regularized Transportation Problems
This article details a general numerical framework to approximate solutions to linear programs related to optimal transport. The general idea is to introduce an entropic regularization of the initial linear program. This regularized problem corresponds to a Kullback-Leibler Bregman divergence projection of a vector (representing some initial joint distribution) on the polytope of constraints. W...
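A minimal sketch of this idea in the two-marginal case is given below: the alternating KL (Bregman) projections onto the row- and column-marginal constraints reduce to Sinkhorn-style diagonal scalings of the Gibbs kernel. Names and parameters are illustrative assumptions; the article itself treats a broader family of transport-related linear programs.

    import numpy as np

    def sinkhorn(C, a, b, eps, iters=500):
        # Entropy-regularized optimal transport between histograms a and b
        # with cost matrix C: alternating KL projections onto the two
        # marginal constraints amount to rescaling the Gibbs kernel
        # K = exp(-C / eps) by diagonal factors u and v.  Illustrative sketch.
        K = np.exp(-C / eps)
        u = np.ones_like(a)
        v = np.ones_like(b)
        for _ in range(iters):
            u = a / (K @ v)        # enforce the row-marginal constraint
            v = b / (K.T @ u)      # enforce the column-marginal constraint
        return u[:, None] * K * v[None, :]   # transport plan P = diag(u) K diag(v)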
Journal
Journal title: SIAM Journal on Optimization
Year: 2022
ISSN: 1095-7189, 1052-6234
DOI: https://doi.org/10.1137/21m140376x